V3/feat/file upload #612
Conversation
Walkthrough
This PR adds S3-based file upload infrastructure for book covers. It introduces a MinIO Docker service, configures S3 environment variables, creates an authenticated upload API endpoint with file validation, updates the Book schema to store cover IDs, and integrates file upload UI components into the book cover form workflow.
Sequence Diagram(s)
```mermaid
sequenceDiagram
participant User
participant FormUI as Form UI
participant API as /api/upload
participant Auth as Auth Service
participant S3 as S3/MinIO
participant DB as Database
User->>FormUI: Select book cover file
FormUI->>API: POST file + auth headers
API->>Auth: Validate session & permissions
alt Not authorized
Auth-->>API: Reject (not admin)
API-->>FormUI: 401/403 error
else Authorized
Auth-->>API: Session valid
API->>S3: Put object (books/{uuid}.{ext})
S3-->>API: objectKey returned
API-->>FormUI: Upload success + objectKey
FormUI->>DB: Update form.coverId
FormUI->>FormUI: Render preview with S3 URL
end
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (2 passed)
Actionable comments posted: 16
🧹 Nitpick comments (22)
app/server/auth.tsx (1)
35-37: Confirm nullability and exposure of profile picture fields
Prisma has `profilePictureId String?`, but here the additional field is declared as `type: 'string'` without an explicit optional/null config. If BetterAuth treats additional fields as required by default, this could break sign-up. Also, if the UI relies on `user.image` as the canonical URL, you may not need to expose `profilePictureUrl` here; if you do, add it consistently.
Would you like me to check BetterAuth's expected shape for optional additional fields and propose a consistent mapping (id vs. URL)?
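For illustration, a minimal sketch of an explicitly optional additional field, assuming BetterAuth's documented `required`/`input` options apply here (the rest of the auth config is omitted):
```ts
import { betterAuth } from 'better-auth';

export const auth = betterAuth({
  // ...database, plugins, and other options omitted
  user: {
    additionalFields: {
      profilePictureId: {
        type: 'string',
        required: false, // assumption: don't block sign-up when the field is absent
        input: false, // assumption: only the server sets this value
      },
    },
  },
});
```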
app/components/form/field-text/index.tsx (1)
19-21: Optional: rename/component split for clarity
`FieldText` now accepts `type="file"`. Consider a dedicated `FieldFile` to encapsulate accept/multiple/size validation and previews, leaving `FieldText` for textual inputs.
app/components/form/form-field-controller.tsx (1)
82-86: Minor: group simple inputs together
You can fold `'file'` into the `'text' | 'email' | 'tel'` group once `FieldText` handles the special case internally, to keep the switch compact.
```diff
- case 'text':
- case 'email':
- case 'tel':
-   return <FieldText {...props} />;
+ case 'text':
+ case 'email':
+ case 'tel':
+ case 'file':
+   return <FieldText {...props} />;
@@
- case 'file':
-   return <FieldText {...props} />;
```
Also applies to: 108-109
prisma/schema.prisma (1)
28-29: Key strategy: prefer a stable `avatars/${user.id}` S3 key
Per prior learning, avatar keys should be `avatars/${user.id}` to overwrite on update (avoid storage bloat). If that's the approach, `profilePictureId` can be derived and may not need storing; otherwise, ensure lifecycle rules delete old keys.
Would you like a migration plan to remove `profilePictureUrl` and rely on `image` + deterministic keying?
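As a sketch of the deterministic-key approach (import paths assume the app's existing `s3client` and `envServer` modules; adjust if the client lives under `@/lib/object-storage`):
```ts
import { PutObjectCommand } from '@aws-sdk/client-s3';
import { s3client } from '@/lib/s3';
import { envServer } from '@/env/server';

export const avatarKey = (userId: string) => `avatars/${userId}`;

export async function putAvatar(userId: string, body: Buffer, contentType: string) {
  const Key = avatarKey(userId);
  await s3client.send(
    new PutObjectCommand({
      Bucket: envServer.S3_BUCKET_NAME,
      Key,
      Body: body,
      ContentType: contentType,
    })
  );
  // The key is derivable from the user id, so persisting profilePictureId becomes optional.
  return Key;
}
```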
app/features/user/schema.ts (1)
28-29: Type it consistently and prevent empty strings
Consider `zu.string.nonEmptyNullish(z.string())` for `profilePictureId` to avoid persisting empty values. If you keep `profilePictureUrl` in Prisma, expose it here as `z.string().nullish()` or remove it from the schema to avoid confusion.
```diff
- profilePictureId: z.string().nullish(),
+ profilePictureId: zu.string.nonEmptyNullish(z.string()),
+ // profilePictureUrl: z.string().nullish(), // if retained in Prisma and used
```
app/locales/ar/account.json (1)
23-31: LGTM: Arabic translations added
Strings are present for all new keys. If you standardize on "update" vs "change" across locales, align later.
app/locales/sw/account.json (1)
24-30: Swahili phrasing/consistency tweaks ("update" vs "improve")
Prefer "sasisha/imesasishwa" to match existing copy ("changeNameDrawer") and avoid "boreshwa" (improve).
```diff
  "changeProfilePictureDrawer": {
    "title": "Badilisha avatar yako",
    "description": "Fomu ya kubadilisha avatar yako",
    "label": "Avatar",
-   "submitButton": "Badilisha",
-   "successMessage": "Avatar imeboreshwa",
-   "errorMessage": "Imeshindikana kuboresha avatar yako"
+   "submitButton": "Sasisha",
+   "successMessage": "Avatar imesasishwa",
+   "errorMessage": "Imeshindikana kusasisha avatar yako"
  }
```
app/env/server.ts (1)
52-53: Typo in comment ("prodution")
```diff
- // Check in prodution if the value is not REPLACE ME
+ // Check in production if the value is not REPLACE ME
```
.env.example (1)
4-15: dotenv-linter nits (quotes/order)
To appease the linter: drop unnecessary quotes and reorder keys (non-functional).
package.json (1)
35-35: Make `dk:start` idempotent for first run
`docker compose --profile dev start` fails if services weren't created. Prefer `up -d` so it also creates them when needed.
```diff
- "dk:start": "docker compose --profile dev start",
+ "dk:start": "docker compose --profile dev up -d",
```
5-10: LGTM; MinIO client correctly parameterized
Looks consistent with better-upload's MinIO helper; credentials and endpoint come from validated env.
If you ever point to AWS S3, ensure `region` is a real AWS region, and consider exposing a `forcePathStyle` toggle (MinIO needs it; AWS doesn't).
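A minimal sketch of such a toggle, written against the AWS SDK client directly; `S3_FORCE_PATH_STYLE` is a hypothetical env flag, not one defined in this PR:
```ts
import { S3Client } from '@aws-sdk/client-s3';
import { envServer } from '@/env/server';

export const s3client = new S3Client({
  endpoint: envServer.S3_ENDPOINT,
  region: envServer.S3_REGION,
  credentials: {
    accessKeyId: envServer.S3_ACCESS_KEY_ID,
    secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
  },
  // MinIO needs path-style URLs; AWS S3 prefers virtual-hosted style.
  forcePathStyle: process.env.S3_FORCE_PATH_STYLE !== 'false',
});
```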
app/features/account/schema.ts (1)
13-19: Align validator helper for consistency
To match the rest of the file, use `zu.string.nonEmpty` for `profilePicture`.
```diff
export const zFormFieldsAccountUpdateProfilePicture = () =>
  z.object({
-   profilePicture: z.string().min(1),
+   profilePicture: zu.string.nonEmpty(z.string()),
  });
```
Optional: name the field `profilePictureId` to reflect what the backend expects, and disable submit until an upload succeeds.
app/features/account/user-card.tsx (1)
25-27: Remove debug log
Extraneous console log on button click; the drawer trigger already handles opening.
```diff
- onClick={() => {
-   console.log('update picture');
- }}
+ onClick={() => {}}
```
Or drop the `onClick` entirely.
app/server/routers/account.tsx (2)
69-71: Resolve TODO: return proper errors and/or separate RPC
Swallowing failures hides issues from the client. Either:
- bubble up a 5xx/structured error, or
- split avatar update into a dedicated RPC and return success/failure explicitly.
I can refactor this into `account.updateAvatar` with clear error handling if helpful.
87-92: Avoid clearing name implicitly
`name: input.name ?? ''` will wipe the name if `name` is omitted. Prefer keeping the existing value if undefined.
```diff
- name: input.name ?? '',
+ ...(typeof input.name === 'string' ? { name: input.name } : {}),
```
app/features/account/change-profile-picture-drawer.tsx (5)
42-47: Use defaultValues instead of values to avoid controlled re-renders (esp. with file inputs)
react-hook-form recommends `defaultValues` for initial state; `values` turns the form into a fully controlled source of truth and can cause unexpected resets or warnings with file inputs.
```diff
- const form = useForm<FormFieldsAccountUpdateProfilePicture>({
-   resolver: zodResolver(zFormFieldsAccountUpdateProfilePicture()),
-   values: {
-     profilePicture: '',
-   },
- });
+ const form = useForm<FormFieldsAccountUpdateProfilePicture>({
+   resolver: zodResolver(zFormFieldsAccountUpdateProfilePicture()),
+   defaultValues: {
+     profilePicture: '',
+   },
+ });
```
83-96: Preserve existing query params when toggling the drawer
Your navigate() overwrites the entire search object. Merge instead to avoid clobbering unrelated params.
```diff
- onOpenChange={(open) => {
-   form.reset();
-   router.navigate({
-     replace: true,
-     to: '.',
-     search: {
-       state: open ? 'change-profile-picture' : '',
-     },
-   });
- }}
+ onOpenChange={(open) => {
+   form.reset();
+   router.navigate({
+     replace: true,
+     to: '.',
+     search: (prev) => ({ ...prev, state: open ? 'change-profile-picture' : '' }),
+   });
+ }}
```
119-139: Harden the file input UX: accept filter, size/type guard, avoid autofocus on mobile.
- Add accept="image/*" (or a stricter list).
- Enforce a size limit and show a friendly error.
- Autofocus on a file input can trigger the picker immediately; consider removing.
```diff
  <FormFieldController
    control={form.control}
    type="file"
    name="profilePicture"
    size="lg"
+   accept="image/png,image/jpeg,image/webp"
    onChange={(e) => {
-     if (e.target.files?.[0]) {
-       upload(e.target.files[0], {
+     const f = e.target.files?.[0];
+     if (f) {
+       const MAX = 5 * 1024 * 1024; // 5MB
+       if (!/^image\/(png|jpe?g|webp)$/.test(f.type) || f.size > MAX) {
+         form.setError('profilePicture', { message: t('account:changeProfilePictureDrawer.invalidFile') || 'Invalid file' });
+         return;
+       }
+       upload(f, {
          metadata: {
            userId: session.data?.user.id,
          },
        });
      }
    }}
-   autoFocus
  />
```
142-149: Disable the submit button until an upload key exists; keep loading state as-is
Prevents accidental empty submits.
```diff
  <Button
    type="submit"
    className="w-full"
    size="lg"
    loading={updateUser.isPending || isPending}
+   disabled={!uploadedFile?.objectKey || updateUser.isPending || isPending}
  >
```
64-81: Leverage server-side keying to avoid storage bloat
Ensure the upload route stores the avatar at `avatars/${user.id}` (as per team convention) and update the returned key accordingly, so subsequent uploads overwrite the previous image.
This aligns with our prior choice to overwrite avatars to avoid orphaned objects. If the server already enforces this, great—just confirm the returned key matches the convention.
docker-compose.yml (2)
37-53: Optional: avoid a public bucket in dev by returning signed URLs
Granting anonymous download simplifies dev but normalizes a risky pattern. Consider removing anonymous access and using signed URLs even in dev.
```diff
- mc anonymous set download default/${S3_BUCKET_NAME};
+ # Consider using presigned URLs instead of anonymous access in dev.
+ # mc anonymous set download default/${S3_BUCKET_NAME};
```
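If you go the signed-URL route, a minimal sketch using `@aws-sdk/s3-request-presigner` (not currently a dependency of this PR) could look like:
```ts
import { GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { s3client } from '@/lib/s3';
import { envServer } from '@/env/server';

// Returns a short-lived URL for a private object instead of relying on anonymous read access.
export async function getAvatarUrl(objectKey: string) {
  return getSignedUrl(
    s3client,
    new GetObjectCommand({ Bucket: envServer.S3_BUCKET_NAME, Key: objectKey }),
    { expiresIn: 60 * 60 } // 1 hour
  );
}
```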
19-36: Pin by digest for reproducibility
The floating tag may change over time. Pinning to a sha256 digest makes CI/dev deterministic.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (21)
- .env.example (2 hunks)
- app/components/form/field-text/index.tsx (1 hunks)
- app/components/form/form-field-controller.tsx (1 hunks)
- app/env/server.ts (1 hunks)
- app/features/account/change-profile-picture-drawer.tsx (1 hunks)
- app/features/account/schema.ts (1 hunks)
- app/features/account/user-card.tsx (2 hunks)
- app/features/user/schema.ts (1 hunks)
- app/lib/object-storage/index.ts (1 hunks)
- app/locales/ar/account.json (1 hunks)
- app/locales/en/account.json (1 hunks)
- app/locales/fr/account.json (1 hunks)
- app/locales/sw/account.json (1 hunks)
- app/routeTree.gen.ts (11 hunks)
- app/routes/api/upload.ts (1 hunks)
- app/routes/app/_layout/account.index.tsx (1 hunks)
- app/server/auth.tsx (1 hunks)
- app/server/routers/account.tsx (2 hunks)
- docker-compose.yml (2 hunks)
- package.json (3 hunks)
- prisma/schema.prisma (1 hunks)
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: yoannfleurydev
PR: BearStudio/start-ui-web#443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
🧬 Code graph analysis (6)
app/components/form/form-field-controller.tsx (1)
app/components/form/field-text/index.tsx (1)
FieldText(24-82)
app/features/account/change-profile-picture-drawer.tsx (5)
app/features/auth/client.ts (1)
authClient (13-22)
app/features/account/schema.ts (2)
FormFieldsAccountUpdateProfilePicture (13-15), zFormFieldsAccountUpdateProfilePicture (16-19)
app/lib/orpc/client.ts (1)
orpc (24-24)
app/components/ui/responsive-drawer.tsx (8)
ResponsiveDrawer (34-40), ResponsiveDrawerTrigger (42-49), ResponsiveDrawerContent (78-85), ResponsiveDrawerHeader (87-94), ResponsiveDrawerTitle (114-121), ResponsiveDrawerDescription (123-130), ResponsiveDrawerBody (96-103), ResponsiveDrawerFooter (105-112)
app/components/form/form-field-controller.tsx (1)
FormFieldController(64-131)
app/lib/object-storage/index.ts (1)
app/env/server.ts (1)
envServer(9-44)
app/features/account/user-card.tsx (1)
app/features/account/change-profile-picture-drawer.tsx (1)
ChangeProfilePictureDrawer(36-155)
app/routes/api/upload.ts (2)
app/lib/object-storage/index.ts (1)
s3client (5-10)
app/env/server.ts (1)
envServer(9-44)
app/server/routers/account.tsx (2)
app/env/server.ts (1)
envServer (9-44)
app/lib/object-storage/index.ts (1)
s3client(5-10)
🪛 GitHub Actions: 🧪 E2E Tests
app/env/server.ts
[error] 9-9: Environment validation failed during 'pnpm db:seed': Missing required environment variables: DOCKER_MINIO_USERNAME, DOCKER_MINIO_PASSWORD, S3_ACCESS_KEY_ID, S3_SECRET_ACCESS_KEY, S3_BUCKET_PUBLIC_URL, S3_ENDPOINT.
🪛 GitHub Check: 🧹 Linter
app/routes/api/upload.ts
[warning] 17-17:
Complete the task associated to this "TODO" comment
app/server/routers/account.tsx
[warning] 69-69:
Complete the task associated to this "TODO" comment
🪛 dotenv-linter (3.3.0)
.env.example
[warning] 4-4: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 5-5: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 6-6: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 7-7: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 7-7: [UnorderedKey] The DOCKER_MINIO_PASSWORD key should go before the DOCKER_MINIO_UI_PORT key
(UnorderedKey)
[warning] 11-11: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 11-11: [UnorderedKey] The S3_BUCKET_NAME key should go before the S3_ENDPOINT key
(UnorderedKey)
[warning] 13-13: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 14-14: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 15-15: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 15-15: [UnorderedKey] The S3_REGION key should go before the S3_SECRET_ACCESS_KEY key
(UnorderedKey)
[warning] 18-18: [DuplicatedKey] The S3_ENDPOINT key is duplicated
(DuplicatedKey)
[warning] 19-19: [DuplicatedKey] The S3_BUCKET_NAME key is duplicated
(DuplicatedKey)
[warning] 19-19: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 19-19: [UnorderedKey] The S3_BUCKET_NAME key should go before the S3_ENDPOINT key
(UnorderedKey)
[warning] 20-20: [DuplicatedKey] The S3_BUCKET_PUBLIC_URL key is duplicated
(DuplicatedKey)
[warning] 21-21: [DuplicatedKey] The S3_ACCESS_KEY_ID key is duplicated
(DuplicatedKey)
[warning] 21-21: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 22-22: [DuplicatedKey] The S3_SECRET_ACCESS_KEY key is duplicated
(DuplicatedKey)
[warning] 22-22: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 23-23: [DuplicatedKey] The S3_REGION key is duplicated
(DuplicatedKey)
[warning] 23-23: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 23-23: [UnorderedKey] The S3_REGION key should go before the S3_SECRET_ACCESS_KEY key
(UnorderedKey)
🔇 Additional comments (9)
app/components/form/form-field-controller.tsx (1)
108-109: File case renders via FieldText — ensure special handling exists
Routing `type="file"` to `FieldText` is fine as long as the component omits `value` and forwards `File | File[]` to RHF (see suggested fix in FieldText). Otherwise, this will misbehave.
app/routes/app/_layout/account.index.tsx (1)
11-11: LGTM: search state extended for profile picture flow
The new `'change-profile-picture'` state is correctly validated and defaults remain unchanged.
app/locales/en/account.json (1)
23-31: LGTM: new i18n keys for profile picture drawer
Keys cover title, description, label, CTA, and toast messages. Consistent with existing naming.
app/locales/fr/account.json (1)
24-30: FR translations look good and consistent with existing keys.
app/routeTree.gen.ts (1)
39-39: Generated route wiring for /api/upload looks correct.
Also applies to: 182-186, 386-387, 397-398, 409-410, 420-446, 456-456, 646-652, 840-841
package.json (1)
44-44: Verify pinned deps & advisories
package.json missing in sandbox — couldn't verify pins. npm registry: [email protected]; @aws-sdk/[email protected]. GitHub advisories query returned no advisories for better-upload. File: package.json (lines 44, 66). Confirm package.json is present, validate the pinned versions, and run npm audit / GH advisory checks before merging.
app/features/account/user-card.tsx (1)
21-49: Nice UX addition
Wrapping the header in `ChangeProfilePictureDrawer` is clean and discoverable.
app/server/routers/account.tsx (1)
53-55: Keying strategy: use a deterministic per-user key (avatars/${context.user.id})
Frontend saves uploadedFile?.objectKey into profilePictureId (app/features/account/change-profile-picture-drawer.tsx:106); I could not find where objectKey is generated — ensure the upload backend/service (or src/lib/s3 config) writes the object as avatars/${context.user.id} (overwriting the previous file) to prevent bucket bloat.
app/features/account/change-profile-picture-drawer.tsx (1)
49-63: Confirmed — useUploadFile returns objectKey; no change required.
Docs show the uploaded file object includes objectKey, so uploadedFile.objectKey is correct (applies to lines 49–63 and 101–108).
```ts
  type: 'text' | 'email' | 'tel' | 'file';
  containerProps?: ComponentProps<'div'>;
} & ComponentProps<typeof Input>
```
File input is incorrectly controlled; pass files to RHF and omit value
For type="file", spreading ...field injects a value prop and forwards the raw event, which triggers React warnings and prevents RHF from receiving the File/File[]. Handle the file case specially.
Apply:
@@
- defaultValue={defaultValue}
+ defaultValue={type === 'file' ? undefined : defaultValue}
@@
- <Input
- type={type}
- id={ctx.id}
- aria-invalid={fieldState.error ? true : undefined}
- aria-describedby={
- !fieldState.error
- ? `${ctx.descriptionId}`
- : `${ctx.descriptionId} ${ctx.errorId}`
- }
- {...rest}
- {...field}
- onChange={(e) => {
- field.onChange(e);
- rest.onChange?.(e);
- }}
- onBlur={(e) => {
- field.onBlur();
- rest.onBlur?.(e);
- }}
- />
+ <Input
+ type={type}
+ id={ctx.id}
+ aria-invalid={fieldState.error ? true : undefined}
+ aria-describedby={
+ !fieldState.error
+ ? `${ctx.descriptionId}`
+ : `${ctx.descriptionId} ${ctx.errorId}`
+ }
+ {...rest}
+ // Avoid controlling file inputs; only pass minimal RHF bindings
+ name={field.name}
+ ref={field.ref}
+ {...(type === 'file'
+ ? {
+ onChange: (e: React.ChangeEvent<HTMLInputElement>) => {
+ const files = Array.from(e.target.files ?? []);
+ field.onChange(rest.multiple ? files : files[0] ?? null);
+ rest.onChange?.(e);
+ },
+ onBlur: (e: React.FocusEvent<HTMLInputElement>) => {
+ field.onBlur();
+ rest.onBlur?.(e);
+ },
+ }
+ : {
+ value: field.value,
+ onChange: (e: React.ChangeEvent<HTMLInputElement>) => {
+ field.onChange(e);
+ rest.onChange?.(e);
+ },
+ onBlur: (e: React.FocusEvent<HTMLInputElement>) => {
+ field.onBlur();
+ rest.onBlur?.(e);
+ },
+ })}
+ />
Also applies to: 57-76
🤖 Prompt for AI Agents
In app/components/form/field-text/index.tsx around lines 19-21 (and also apply
the same fix to lines 57-76), the component currently spreads ...field into the
Input which for type="file" passes a value prop and the raw event, causing React
warnings and preventing React Hook Form from receiving File/File[]; modify the
component to handle type==="file" specially by omitting the field.value when
spreading, and wire onChange to extract files (e.g., e.target.files) and call
field.onChange with the File or File[] and call field.onBlur as appropriate;
ensure you also do not pass a value prop to the native file input and keep other
props (name, ref) intact so RHF correctly registers the files.
```ts
DOCKER_MINIO_API_PORT: z.string().default('9000'),
DOCKER_MINIO_UI_PORT: z.string().default('9001'),
DOCKER_MINIO_USERNAME: z.string(),
DOCKER_MINIO_PASSWORD: z.string(),
S3_ACCESS_KEY_ID: z.string(),
S3_SECRET_ACCESS_KEY: z.string(),
S3_BUCKET_NAME: z.string().default('default'),
S3_REGION: z.string().default('default'),
S3_BUCKET_PUBLIC_URL: z.string(),
S3_ENDPOINT: z.string(),
```
Env validation breaks CI; make S3/MinIO vars optional in non‑prod and validate as URLs.
E2E fails due to required S3/MinIO vars during db:seed (not needed there). Provide non‑prod defaults and use .url() for URL-ish vars.
- DOCKER_MINIO_API_PORT: z.string().default('9000'),
- DOCKER_MINIO_UI_PORT: z.string().default('9001'),
- DOCKER_MINIO_USERNAME: z.string(),
- DOCKER_MINIO_PASSWORD: z.string(),
- S3_ACCESS_KEY_ID: z.string(),
- S3_SECRET_ACCESS_KEY: z.string(),
- S3_BUCKET_NAME: z.string().default('default'),
- S3_REGION: z.string().default('default'),
- S3_BUCKET_PUBLIC_URL: z.string(),
- S3_ENDPOINT: z.string(),
+ DOCKER_MINIO_API_PORT: z.string().default('9000'),
+ DOCKER_MINIO_UI_PORT: z.string().default('9001'),
+ DOCKER_MINIO_USERNAME: isProd ? z.string() : z.string().default('startui'),
+ DOCKER_MINIO_PASSWORD: isProd ? z.string() : z.string().default('startui'),
+ S3_ACCESS_KEY_ID: isProd ? z.string() : z.string().default('startui-access-key'),
+ S3_SECRET_ACCESS_KEY: isProd ? z.string() : z.string().default('startui-secret-key'),
+ S3_BUCKET_NAME: z.string().default('start-ui-bucket'),
+ S3_REGION: z.string().default('us-east-1'),
+ S3_BUCKET_PUBLIC_URL: (isProd ? z.string().url() : z.string().url().default('http://127.0.0.1:9000/start-ui-bucket')),
+ S3_ENDPOINT: (isProd ? z.string().url() : z.string().url().default('http://127.0.0.1:9000')),If you prefer stricter gating, require these only when FILE_UPLOADS_ENABLED=true. I can draft that follow-up.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
DOCKER_MINIO_API_PORT: z.string().default('9000'),
DOCKER_MINIO_UI_PORT: z.string().default('9001'),
DOCKER_MINIO_USERNAME: isProd ? z.string() : z.string().default('startui'),
DOCKER_MINIO_PASSWORD: isProd ? z.string() : z.string().default('startui'),
S3_ACCESS_KEY_ID: isProd ? z.string() : z.string().default('startui-access-key'),
S3_SECRET_ACCESS_KEY: isProd ? z.string() : z.string().default('startui-secret-key'),
S3_BUCKET_NAME: z.string().default('start-ui-bucket'),
S3_REGION: z.string().default('us-east-1'),
S3_BUCKET_PUBLIC_URL: (isProd ? z.string().url() : z.string().url().default('http://127.0.0.1:9000/start-ui-bucket')),
S3_ENDPOINT: (isProd ? z.string().url() : z.string().url().default('http://127.0.0.1:9000')),
```
🤖 Prompt for AI Agents
In app/env/server.ts around lines 30 to 39, the current zod schema makes
S3/MinIO env vars required which breaks CI; update the schema to make S3/MinIO
variables optional or provide sensible non‑prod defaults (e.g., default empty
string or default ports) so db:seed and other non‑upload workflows don't fail,
and change URL-like fields (S3_BUCKET_PUBLIC_URL, S3_ENDPOINT) to use
z.string().url() (or z.string().optional().url()) to validate they are valid
URLs; alternatively (preferred for stricter control) only require these vars
when FILE_UPLOADS_ENABLED=true by gating their .required() validation behind
that flag.
```ts
import {
  type FormFieldsAccountUpdateProfilePicture,
  zFormFieldsAccountUpdateProfilePicture,
} from '@/features/account/schema';
import { authClient } from '@/features/auth/client';
```
🛠️ Refactor suggestion
Schema/field type mismatch: profilePicture is a file input but validated as string.
You never set the field value, so zod min(1) will fail and block submit. Either bind the selected file name to the field or, better, validate the presence of an uploadedFile before submit and remove the string constraint.
Minimal in-component guard (keeps schema unchanged but avoids undefined payload):
- onSubmit={async ({ profilePicture }) => {
- console.log('form', { profilePicture, uploadedFile });
- updateUser.mutate({ profilePictureId: uploadedFile?.objectKey });
- }}
+ onSubmit={async () => {
+ if (!uploadedFile?.objectKey) {
+ form.setError('profilePicture', {
+ message: t('account:changeProfilePictureDrawer.missingFile') || 'Please select a picture',
+ });
+ return;
+ }
+ updateUser.mutate({ profilePictureId: uploadedFile.objectKey });
+ }}
Preferable follow-up: update zFormFieldsAccountUpdateProfilePicture() to accept a File/FileList (or make it optional) and rely on the uploadedFile presence instead of a string length check. Do you want a patch for the schema too?
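A hedged sketch of such a schema, validating an actual `File` and mirroring the route's 4MB limit; `z.instanceof(File)` assumes an environment where `File` is defined (browser or a recent Node runtime):
```ts
import { z } from 'zod';

export const zFormFieldsAccountUpdateProfilePicture = () =>
  z.object({
    profilePicture: z
      .instanceof(File, { message: 'Please select a picture' })
      .refine((file) => file.size <= 4 * 1024 * 1024, 'Max 4MB')
      .refine(
        (file) => ['image/png', 'image/jpeg', 'image/webp'].includes(file.type),
        'Unsupported image type'
      )
      .optional(), // rely on the uploadedFile presence at submit time
  });
```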
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```tsx
import {
  type FormFieldsAccountUpdateProfilePicture,
  zFormFieldsAccountUpdateProfilePicture,
} from '@/features/account/schema';
import { authClient } from '@/features/auth/client';
```
```tsx
onSubmit={async () => {
  if (!uploadedFile?.objectKey) {
    form.setError('profilePicture', {
      message: t('account:changeProfilePictureDrawer.missingFile') || 'Please select a picture',
    });
    return;
  }
  updateUser.mutate({ profilePictureId: uploadedFile.objectKey });
}}
```
- demo mode block upload
Actionable comments posted: 1
🧹 Nitpick comments (1)
package.json (1)
34-34: Make `dk:init` more reliable by waiting for services to be healthy
Compose supports `--wait` to block until healthchecks pass. This avoids races when MinIO is needed immediately after init.
```diff
- "dk:init": "docker compose --profile init up -d",
+ "dk:init": "docker compose --profile init up -d --wait",
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- docker-compose.yml (1 hunks)
- package.json (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- docker-compose.yml
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: yoannfleurydev
PR: BearStudio/start-ui-web#443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
🔇 Additional comments (1)
package.json (1)
44-44: No action required — @aws-sdk/client-s3 is only imported in server code. Import found at app/server/routers/account.tsx:1; no matches under app/components, app/features, or app/routes.
package.json (Outdated)
| "@tanstack/zod-adapter": "1.121.34", | ||
| "@uidotdev/usehooks": "2.4.1", | ||
| "better-auth": "1.2.7", | ||
| "better-upload": "1.1.0", |
💡 Verification agent
🧩 Analysis chain
Harden file uploads (limits, type checks, storage keys) with better-upload
With uploads enabled, enforce server-side protections and consistent keys.
- Enforce max file size, file count, and accepted MIME types using magic-bytes detection (not just extensions).
- Reject MIME/extension mismatches and strip metadata; consider basic image transforms for avatars.
- Rate-limit and set per-user quotas; add timeouts and backpressure.
- Store avatars under a stable key to overwrite prior uploads (per your retrieved learning: use `avatars/${user.id}`); confirm this is what your uploader uses.
- Log upload attempts and surface errors via i18n.
Quick checks:
If missing, I can draft the upload handler with strict limits and S3 keying.
🏁 Script executed:
#!/bin/bash
# Look for upload config knobs; ensure they exist and are non-trivial.
rg -nP -C2 '\b(max(File)?Size|maxBytes|fileSize|maxFiles|allowed(Mime|Types)|content[-_ ]?type|magic|sniff)\b' app || true
# Ensure avatar S3 keys follow the overwrite pattern.
rg -nP -C2 'avatars/\$\{?user\.id\}?|avatars/.+user\.id' app || true
Length of output: 337
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "=== show upload.ts ==="
nl -ba app/routes/api/upload.ts || true
echo
echo "=== search for explicit avatars/${user.id} key patterns ==="
rg -nP -C2 'avatars/\$\{?user\.id\}?|avatars/.+user\.id' -S || true
echo
echo "=== search for general 'avatar' occurrences ==="
rg -n -C2 '\bavatar\b' || true
echo
echo "=== search for S3 / storage usage ==="
rg -nP -C2 '\b(S3Client|S3|s3|putObject|PutObjectCommand|uploadFile|upload_to_s3|uploadToS3|createPresignedPost|presign|presigned|bucket|put_object)\b' -S || true
echo
echo "=== search for file-type / magic-bytes detection libs ==="
rg -nP -C2 '\b(file-type|filetype|magic|magic-bytes|detect-file-type|file-type-from-buffer|fromBuffer)\b' -S || true
echo
echo "=== search for image transforms / metadata stripping ==="
rg -nP -C2 'sharp\(|jimp|imagemagick|gm\(|exif|stripMetadata|strip_meta|removeMetadata' -S || true
echo
echo "=== search for rate-limit / quota / timeout hints ==="
rg -nP -C2 'rate[-_ ]?limit|quota|per[-_ ]?user|throttle|backpressure|timeout' -S || true
Length of output: 161
🏁 Script executed:
#!/bin/bash
# safer re-check of upload handler and related patterns (avoid nl, avoid unescaped ${})
set -e
file="app/routes/api/upload.ts"
if [ -f "$file" ]; then
printf "=== show %s ===\n" "$file"
awk '{printf("%5d %s\n", NR, $0)}' "$file"
else
printf "NOT FOUND: %s\n" "$file"
fi
printf "\n=== search for explicit avatars/\${user.id} key patterns ===\n"
rg -nP -C2 'avatars/\$\{?user\.id\}?|avatars/.+user\.id' -S || true
printf "\n=== search for general 'avatar' occurrences ===\n"
rg -n -C2 '\bavatar\b' -S || true
printf "\n=== search for S3 / storage usage ===\n"
rg -nP -C2 '\b(S3Client|S3|s3|putObject|PutObjectCommand|uploadFile|upload_to_s3|uploadToS3|createPresignedPost|presign|presigned|bucket|put_object)\b' -S || true
printf "\n=== search for file-type / magic-bytes detection libs ===\n"
rg -nP -C2 '\b(file-type|filetype|magic|magic-bytes|detect-file-type|file-type-from-buffer|fromBuffer)\b' -S || true
printf "\n=== search for image transforms / metadata stripping ===\n"
rg -nP -C2 'sharp\(|jimp|imagemagick|gm\(|exif|stripMetadata|strip_meta|removeMetadata' -S || true
printf "\n=== search for rate-limit / quota / timeout hints ===\n"
rg -nP -C2 'rate[-_ ]?limit|quota|per[-_ ]?user|throttle|backpressure|timeout' -S || true
Length of output: 39468
Harden avatar uploads: add content‑sniffing, stable S3 keying, auth, quotas & metadata stripping
Found: app/routes/api/upload.ts — avatar route sets fileTypes: ['image/*'] and maxFileSize: 4MB, but onBeforeUpload is TODO and there’s no explicit avatar S3 key/config.
Actionable fixes:
- Add server-side magic-bytes/content-sniffing (reject MIME/extension mismatches); a sketch follows this list.
- Strip EXIF/metadata and apply basic image transforms (resize/normalize) for avatars.
- Implement auth/authorization in onBeforeUpload and set the object key to avatars/${user.id} so uploads overwrite previous avatars (coordinate with app/server/routers/account.tsx where DeleteObjectCommand is used).
- Add per-user rate-limits/quotas and request timeouts/backpressure.
- Log upload attempts and surface i18n error keys (account locale files already contain avatar messages).
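A sketch of the first two fixes, using `file-type` for magic-byte sniffing and `sharp` for re-encoding; both would be new dependencies, and the 4MB limit mirrors the route's current setting:
```ts
import { fileTypeFromBuffer } from 'file-type';
import sharp from 'sharp';

const ALLOWED_MIME = new Set(['image/png', 'image/jpeg', 'image/webp']);
const MAX_BYTES = 4 * 1024 * 1024; // matches the route's 4MB limit

// Sniffs the real content type, strips metadata by re-encoding, and returns the
// stable per-user key so each upload overwrites the previous avatar.
export async function prepareAvatar(userId: string, input: Buffer) {
  if (input.byteLength > MAX_BYTES) throw new Error('File too large');

  const detected = await fileTypeFromBuffer(input);
  if (!detected || !ALLOWED_MIME.has(detected.mime)) {
    throw new Error('Unsupported or mismatched file type');
  }

  const body = await sharp(input)
    .rotate() // apply EXIF orientation before metadata is dropped
    .resize(512, 512, { fit: 'cover' })
    .webp()
    .toBuffer();

  return { key: `avatars/${userId}`, body, contentType: 'image/webp' };
}
```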
🤖 Prompt for AI Agents
package.json around line 66: The avatar upload endpoint needs
hardening—implement server-side content-sniffing to validate magic bytes vs
declared MIME/extension and reject mismatches; in onBeforeUpload enforce
auth/authorization, per-user rate-limits/quotas and request
timeouts/backpressure, and map errors to existing i18n keys; process uploaded
images to strip EXIF/metadata and apply basic transforms (resize/normalize)
before storage; and use a stable S3 object key pattern avatars/${user.id} so
uploads overwrite previous avatars (coordinate with account deletion code), plus
add structured logging for all upload attempts including outcomes and reasons.
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
app/server/routers/account.tsx (1)
85-91: Don't blank name when updating only the avatar; update name conditionally
Currently, if you remove the early `return`, the unconditional update would set `name` to `''` when not provided. Make it conditional:
```diff
- await context.db.user.update({
-   where: { id: context.user.id },
-   data: {
-     name: input.name ?? '',
-   },
- });
+ if (input.name !== undefined) {
+   await context.db.user.update({
+     where: { id: context.user.id },
+     data: { name: input.name ?? '' },
+   });
+ }
```
♻️ Duplicate comments (2)
app/features/account/user-card.tsx (1)
20-22: Fix AvatarImage src; avoid an "undefined" URL and normalize the join
Construct the URL only when both base and id exist; otherwise pass `undefined` so no request is made. Also join via `new URL` to avoid double slashes.
```diff
- const getImageUrl = () => {
-   return `${envClient.VITE_S3_BUCKET_PUBLIC_URL}/${session.data?.user.avatarFileId}`;
- };
+ const avatarSrc =
+   session.data?.user.avatarFileId && envClient.VITE_S3_BUCKET_PUBLIC_URL
+     ? new URL(
+         session.data.user.avatarFileId,
+         envClient.VITE_S3_BUCKET_PUBLIC_URL
+       ).toString()
+     : undefined;
```
```diff
- {session.data?.user.avatarFileId ? (
-   <AvatarImage
-     src={getImageUrl()}
-     alt={session.data?.user.name}
-   />
- ) : (
+ {session.data?.user.avatarFileId ? (
+   <AvatarImage
+     src={avatarSrc}
+     alt={session.data?.user.name ?? 'Avatar'}
+   />
+ ) : (
```
Also applies to: 31-35
app/server/routers/account.tsx (1)
53-67: Guard the S3 delete and fix log field names; delete only when the key changed
Avoid sending DeleteObject with an undefined key, and skip deletion when the new key matches the old one.
```diff
- // If profilePictureId is defined, check to delete the old one
- if (input.avatarFileId) {
-   // Remove old file if there was one
-   const deleteCommand = new DeleteObjectCommand({
-     Bucket: envServer.S3_BUCKET_NAME,
-     Key: context.user.avatarFileId,
-   });
-   try {
-     await s3client.send(deleteCommand);
-   } catch (error) {
-     context.logger.warn('Fail to delete user profile picture', {
-       profilePictureId: context.user.avatarFileId,
-       error,
-     });
-   }
+ if (input.avatarFileId) {
+   if (
+     context.user.avatarFileId &&
+     context.user.avatarFileId !== input.avatarFileId
+   ) {
+     const deleteCommand = new DeleteObjectCommand({
+       Bucket: envServer.S3_BUCKET_NAME,
+       Key: context.user.avatarFileId,
+     });
+     try {
+       await s3client.send(deleteCommand);
+     } catch (error) {
+       context.logger.warn('Fail to delete user avatar', {
+         avatarFileId: context.user.avatarFileId,
+         error,
+       });
+     }
+   }
```
🧹 Nitpick comments (1)
app/server/routers/account.tsx (1)
69-82: Persist new avatarFileId (ok) — align logs with the new naming
```diff
- } catch (error) {
-   context.logger.warn('Fail to save user profile picture', {
-     profilePictureId: context.user.avatarFileId,
-     error,
-   });
- }
- return;
+ } catch (error) {
+   context.logger.warn('Fail to save user avatar', {
+     avatarFileId: input.avatarFileId,
+     error,
+   });
+ }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (11)
- app/env/client.ts (1 hunks)
- app/env/server.ts (1 hunks)
- app/features/account/change-profile-picture-drawer.tsx (1 hunks)
- app/features/account/schema.ts (1 hunks)
- app/features/account/user-card.tsx (1 hunks)
- app/features/user/schema.ts (1 hunks)
- app/lib/s3/index.ts (1 hunks)
- app/routes/api/upload.ts (1 hunks)
- app/server/auth.tsx (1 hunks)
- app/server/routers/account.tsx (2 hunks)
- prisma/schema.prisma (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (5)
- app/server/auth.tsx
- app/env/server.ts
- app/features/account/change-profile-picture-drawer.tsx
- prisma/schema.prisma
- app/features/account/schema.ts
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: yoannfleurydev
PR: BearStudio/start-ui-web#443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
📚 Learning: 2024-10-14T15:29:53.279Z
Learnt from: ivan-dalmet
PR: BearStudio/start-ui-web#532
File: src/server/config/oauth/providers/discord.ts:11-11
Timestamp: 2024-10-14T15:29:53.279Z
Learning: In `src/server/config/oauth/providers/discord.ts`, when defining the `zDiscordUser` schema, keep fields like `username` as nullable (e.g., `.nullish()`), as the Discord API might return null values despite the documentation stating otherwise.
Applied to files:
app/features/user/schema.ts
📚 Learning: 2025-02-21T08:38:40.338Z
Learnt from: yoannfleurydev
PR: BearStudio/start-ui-web#443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
Applied to files:
app/server/routers/account.tsx, app/routes/api/upload.ts
🧬 Code graph analysis (4)
app/lib/s3/index.ts (1)
app/env/server.ts (1)
envServer(9-43)
app/features/account/user-card.tsx (2)
app/env/client.ts (1)
envClient (24-54)
app/features/account/change-profile-picture-drawer.tsx (1)
ChangeProfilePictureDrawer(37-160)
app/server/routers/account.tsx (2)
app/env/server.ts (1)
envServer (9-43)
app/lib/s3/index.ts (1)
s3client(5-12)
app/routes/api/upload.ts (3)
app/server/router.ts (1)
router (11-16)
app/lib/s3/index.ts (1)
s3client (5-12)
app/env/server.ts (1)
envServer(9-43)
🪛 GitHub Actions: 🔎 Code Quality
app/env/client.ts
[error] 24-24: Invalid environment variables: 'VITE_S3_BUCKET_PUBLIC_URL' is required but undefined.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Playwright E2E Tests
🔇 Additional comments (2)
app/features/user/schema.ts (1)
27-27: LGTM: schema aligns with the new avatar storage model
Switch to `avatarFileId` is consistent with the backend and UI changes.
app/routes/api/upload.ts (1)
12-21: Good: explicit image allowlist (SVG excluded)
```diff
@@ -43,6 +43,7 @@ export const envClient = createEnv({
      .string()
      .optional()
      .transform((value) => value ?? (isDev ? 'gold' : 'plum')),
+   VITE_S3_BUCKET_PUBLIC_URL: z.string(),
```
💡 Verification agent
🧩 Analysis chain
Build is red: make URL typed and ensure it’s provided in CI
VITE_S3_BUCKET_PUBLIC_URL is required but undefined in the pipeline. Also validate it as a URL.
Apply:
- VITE_S3_BUCKET_PUBLIC_URL: z.string(),
+ VITE_S3_BUCKET_PUBLIC_URL: z.string().url(),
Follow-ups:
- Add this var to .env.example, local envs, and CI (or set SKIP_ENV_VALIDATION only where acceptable).
- It must be an absolute base URL (e.g., https://cdn.example.com/bucket/).
Run:
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "Usages:"
rg -n --heading 'VITE_S3_BUCKET_PUBLIC_URL' app
echo -e "\nCheck .env example(s):"
fd -H '^\.env(\..+)?$' | while read -r f; do
echo "==> $f"
rg -n 'VITE_S3_BUCKET_PUBLIC_URL' "$f" || true
done
Length of output: 239
Require & validate VITE_S3_BUCKET_PUBLIC_URL and add it to envs/CI
- Change validation in app/env/client.ts:
```diff
- VITE_S3_BUCKET_PUBLIC_URL: z.string(),
+ VITE_S3_BUCKET_PUBLIC_URL: z.string().url(),
```
- Add VITE_S3_BUCKET_PUBLIC_URL to .env.example and the pipeline/CI secrets with an absolute base URL.
- Usage note: app/features/account/user-card.tsx:21 builds `${envClient.VITE_S3_BUCKET_PUBLIC_URL}/${session.data?.user.avatarFileId}` — normalize the base (no trailing slash) or update the join logic to avoid double slashes.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
- VITE_S3_BUCKET_PUBLIC_URL: z.string(),
+ VITE_S3_BUCKET_PUBLIC_URL: z.string().url(),
```
🤖 Prompt for AI Agents
In app/env/client.ts around line 46 add validation for VITE_S3_BUCKET_PUBLIC_URL
(z.string()) so the client env includes the S3 public base URL; then update
.env.example and your CI/pipeline secrets to provide an absolute base URL;
finally ensure the value is normalized (strip any trailing slash) or update the
usage at app/features/account/user-card.tsx (where it builds
`${envClient.VITE_S3_BUCKET_PUBLIC_URL}/${session.data?.user.avatarFileId}`) to
safely join parts and avoid double slashes.
```ts
export const s3client = new S3Client({
  endpoint: envServer.S3_ENDPOINT,
  credentials: {
    accessKeyId: envServer.S3_ACCESS_KEY_ID,
    secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
  },
  region: envServer.S3_REGION,
});
```
🛠️ Refactor suggestion
Enable path-style access for MinIO and improve robustness
MinIO typically requires path-style addressing. Without it, requests may fail locally.
Apply:
export const s3client = new S3Client({
endpoint: envServer.S3_ENDPOINT,
+ forcePathStyle: true,
credentials: {
accessKeyId: envServer.S3_ACCESS_KEY_ID,
secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
},
region: envServer.S3_REGION,
+ maxAttempts: 3,
});
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
export const s3client = new S3Client({
  endpoint: envServer.S3_ENDPOINT,
  forcePathStyle: true,
  credentials: {
    accessKeyId: envServer.S3_ACCESS_KEY_ID,
    secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
  },
  region: envServer.S3_REGION,
  maxAttempts: 3,
});
```
🤖 Prompt for AI Agents
In app/lib/s3/index.ts around lines 5 to 12, the S3Client is missing path-style
addressing which MinIO requires; update the S3Client config to include
forcePathStyle: true so requests use path-style addressing for MinIO, and keep
the existing endpoint, credentials and region values intact to maintain current
behavior.
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.env.example (1)
27-29: DATABASE_URL references undefined DOCKER_DATABASE_* vars in this template
Either reintroduce the vars or inline a self-contained local DSN to prevent confusion.
```diff
-# DATABASE
-DATABASE_URL="postgres://${DOCKER_DATABASE_USERNAME}:${DOCKER_DATABASE_PASSWORD}@localhost:${DOCKER_DATABASE_PORT}/${DOCKER_DATABASE_NAME}"
+# DATABASE
+# Local example DSN (adjust as needed)
+DATABASE_URL=postgres://postgres:postgres@localhost:5432/startui
```
Alternatively, add:
-# DATABASE -DATABASE_URL="postgres://${DOCKER_DATABASE_USERNAME}:${DOCKER_DATABASE_PASSWORD}@localhost:${DOCKER_DATABASE_PORT}/${DOCKER_DATABASE_NAME}" +# DATABASE +# Local example DSN (adjust as needed) +DATABASE_URL=postgres://postgres:postgres@localhost:5432/startuiAlternatively, add:
DOCKER_DATABASE_PORT=5432 DOCKER_DATABASE_NAME=startui DOCKER_DATABASE_USERNAME=postgres DOCKER_DATABASE_PASSWORD=postgres
♻️ Duplicate comments (2)
.env.example (2)
17-17: Public URL now targets the API port — looks good. Ensure the access policy.
LGTM and resolves the earlier "console port" issue. If the bucket isn't public, fetching `${VITE_S3_BUCKET_PUBLIC_URL}/${key}` will 403; either set a public-read bucket policy for dev or use signed URLs.
9-14: Fix undefined var in S3_ENDPOINT; align ordering and quotes
DOCKER_MINIO_FORWARD_PORT isn't defined here; use DOCKER_MINIO_API_PORT. Also unquote and reorder per linter.
```diff
-# S3
-S3_ENDPOINT="http://127.0.0.1:${DOCKER_MINIO_FORWARD_PORT}"
-S3_BUCKET_NAME="start-ui-bucket"
-S3_ACCESS_KEY_ID="startui-access-key"
-S3_SECRET_ACCESS_KEY="startui-secret-key"
-S3_REGION="default"
+# S3
+S3_ACCESS_KEY_ID=startui-access-key
+S3_BUCKET_NAME=start-ui-bucket
+S3_ENDPOINT=http://127.0.0.1:${DOCKER_MINIO_API_PORT}
+S3_REGION=default
+S3_SECRET_ACCESS_KEY=startui-secret-key
+
+# If you expose MinIO via a forwarded port/reverse-proxy, document it separately to avoid collisions, e.g.:
+# FORWARDED_S3_ENDPOINT=http://127.0.0.1:${DOCKER_MINIO_FORWARD_PORT}
```
🧹 Nitpick comments (2)
.env.example (2)
3-7: Unquote values and fix key order (dotenv-linter)
Remove quotes for numbers/identifiers and order keys so PASSWORD comes before UI_PORT to satisfy lint hints.
```diff
 # OBJECT STORAGE
-DOCKER_MINIO_API_PORT="9000"
-DOCKER_MINIO_UI_PORT="9001"
-DOCKER_MINIO_USERNAME="startui"
-DOCKER_MINIO_PASSWORD="startui"
+DOCKER_MINIO_API_PORT=9000
+DOCKER_MINIO_USERNAME=startui
+DOCKER_MINIO_PASSWORD=startui
+DOCKER_MINIO_UI_PORT=9001
```
31-31: Fix grammar in comment
Change "to a generated secret" → "to generate a secret".
```diff
-BETTER_AUTH_SECRET="REPLACE ME" # You can use `npx @better-auth/cli@latest secret` to a generated secret
+BETTER_AUTH_SECRET="REPLACE ME" # You can use `npx @better-auth/cli@latest secret` to generate a secret
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- .env.example (2 hunks)
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: yoannfleurydev
PR: BearStudio/start-ui-web#443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
🪛 dotenv-linter (3.3.0)
.env.example
[warning] 4-4: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 5-5: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 6-6: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 7-7: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 7-7: [UnorderedKey] The DOCKER_MINIO_PASSWORD key should go before the DOCKER_MINIO_UI_PORT key
(UnorderedKey)
[warning] 11-11: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 11-11: [UnorderedKey] The S3_BUCKET_NAME key should go before the S3_ENDPOINT key
(UnorderedKey)
[warning] 12-12: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 12-12: [UnorderedKey] The S3_ACCESS_KEY_ID key should go before the S3_BUCKET_NAME key
(UnorderedKey)
[warning] 13-13: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 14-14: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 14-14: [UnorderedKey] The S3_REGION key should go before the S3_SECRET_ACCESS_KEY key
(UnorderedKey)
Force-pushed a750dda to c49d5bf (Compare)
Actionable comments posted: 4
♻️ Duplicate comments (4)
src/components/form/field-text/index.tsx (1)
47-76: File input is incorrectly controlled
For `type="file"`, spreading `{...field}` passes a `value` prop to the native file input, which triggers React warnings and prevents React Hook Form from receiving the `File`/`File[]` objects. File inputs must be uncontrolled.
Apply this diff to handle file inputs specially:
```diff
  <Controller
    name={name}
    control={control}
    disabled={disabled}
-   defaultValue={defaultValue}
+   defaultValue={type === 'file' ? undefined : defaultValue}
    shouldUnregister={shouldUnregister}
    render={({ field, fieldState }) => (
      <div
        {...containerProps}
        className={cn(
          'flex flex-1 flex-col gap-1',
          containerProps?.className
        )}
      >
        <Input
          type={type}
          id={ctx.id}
          aria-invalid={fieldState.error ? true : undefined}
          aria-describedby={
            !fieldState.error
              ? `${ctx.descriptionId}`
              : `${ctx.descriptionId} ${ctx.errorId}`
          }
          {...rest}
-         {...field}
-         onChange={(e) => {
-           field.onChange(e);
-           rest.onChange?.(e);
-         }}
-         onBlur={(e) => {
-           field.onBlur();
-           rest.onBlur?.(e);
-         }}
+         name={field.name}
+         ref={field.ref}
+         {...(type === 'file'
+           ? {
+               onChange: (e: React.ChangeEvent<HTMLInputElement>) => {
+                 const files = Array.from(e.target.files ?? []);
+                 field.onChange(rest.multiple ? files : files[0] ?? null);
+                 rest.onChange?.(e);
+               },
+               onBlur: (e: React.FocusEvent<HTMLInputElement>) => {
+                 field.onBlur();
+                 rest.onBlur?.(e);
+               },
+             }
+           : {
+               value: field.value,
+               onChange: (e: React.ChangeEvent<HTMLInputElement>) => {
+                 field.onChange(e);
+                 rest.onChange?.(e);
+               },
+               onBlur: (e: React.FocusEvent<HTMLInputElement>) => {
+                 field.onBlur();
+                 rest.onBlur?.(e);
+               },
+             })}
        />
        <FormFieldError />
      </div>
    )}
  />
```
18-34: Healthcheck references the unavailable `mc` command
The healthcheck runs `mc ready local`, but the official `minio/minio` image doesn't include the `mc` CLI tool. Use MinIO's built-in HTTP health endpoint instead:
```diff
 healthcheck:
-  test: ['CMD', 'mc', 'ready', 'local']
+  test: ['CMD-SHELL', 'curl -f http://localhost:9000/minio/health/ready || exit 1']
   interval: 5s
   timeout: 5s
   retries: 5
```
35-52: Use the internal container port in the alias configuration
The `createbucket` service references `${DOCKER_MINIO_API_PORT}` when configuring the MinIO alias. Within the Docker Compose network, the service should connect to the container's internal port (9000) rather than the host-mapped port.
Apply this diff:
```diff
 command: ["sh", "-c", "
-  mc alias set default http://minio:${DOCKER_MINIO_API_PORT} \"${DOCKER_MINIO_USERNAME}\" \"${DOCKER_MINIO_PASSWORD}\";
+  mc alias set default http://minio:9000 \"${DOCKER_MINIO_USERNAME}\" \"${DOCKER_MINIO_PASSWORD}\";
   mc admin user add default ${S3_ACCESS_KEY_ID} ${S3_SECRET_ACCESS_KEY};
   mc admin policy attach default readwrite --user ${S3_ACCESS_KEY_ID};
   mc mb --ignore-existing default/${S3_BUCKET_NAME} 2>/dev/null;
   mc anonymous set download default/${S3_BUCKET_NAME};
   echo 'Bucket configuration completed successfully';
 " ]
```
46-46: Pipeline failure: VITE_S3_BUCKET_PUBLIC_URL is undefined and lacks URL validation
The test environment fails because `VITE_S3_BUCKET_PUBLIC_URL` is required but not provided. Additionally, the schema should validate it as a URL.
Apply this diff to add URL validation:
```diff
- VITE_S3_BUCKET_PUBLIC_URL: z.string(),
+ VITE_S3_BUCKET_PUBLIC_URL: z.string().url(),
```
Then add `VITE_S3_BUCKET_PUBLIC_URL` to:
- `.env.example` with a sample value (e.g., http://localhost:9000/default)
- Your CI environment variables with the appropriate public base URL
Note: Ensure the URL does not have a trailing slash, or update usages like `${envClient.VITE_S3_BUCKET_PUBLIC_URL}/${avatarFileId}` to handle path joining correctly.
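A tiny helper along these lines would make the join independent of trailing slashes (the `buildAvatarUrl` name is illustrative):
```ts
export function buildAvatarUrl(
  baseUrl: string,
  objectKey: string | null | undefined
) {
  if (!objectKey) return undefined;
  // Trim slashes on both sides so the env value may or may not end with '/'.
  return `${baseUrl.replace(/\/+$/, '')}/${objectKey.replace(/^\/+/, '')}`;
}

// Usage (names taken from this PR):
// buildAvatarUrl(envClient.VITE_S3_BUCKET_PUBLIC_URL, session.data?.user.avatarFileId)
```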
🧹 Nitpick comments (2)
src/features/account/user-card.tsx (1)
20-22: Add type safety to URL construction
The `getImageUrl` function doesn't validate that `avatarFileId` exists before constructing the URL, which could produce malformed URLs if called when `avatarFileId` is undefined/null. While the current usage (line 31) checks before calling, the function itself should be defensive.
Apply this diff to add a safety check:
```diff
- const getImageUrl = () => {
-   return `${envClient.VITE_S3_BUCKET_PUBLIC_URL}/${session.data?.user.avatarFileId}`;
- };
+ const getImageUrl = (fileId: string) => {
+   return `${envClient.VITE_S3_BUCKET_PUBLIC_URL}/${fileId}`;
+ };
```
And update the call site:
```diff
- <AvatarImage
-   src={getImageUrl()}
-   alt={session.data?.user.name}
- />
+ <AvatarImage
+   src={getImageUrl(session.data.user.avatarFileId)}
+   alt={session.data?.user.name}
+ />
```
src/lib/s3/index.ts (1)
1-12: Distinctly name S3 clients to prevent ambiguity
Both src/lib/s3/index.ts (AWS SDK) and src/lib/object-storage/index.ts (MinIO) export `s3client`. Rename one or both exports (e.g., `awsS3Client`, `minioClient`) and update import sites to clearly differentiate implementations.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yamlis excluded by!**/pnpm-lock.yaml
📒 Files selected for processing (17)
- .env.example (2 hunks)
- docker-compose.yml (1 hunks)
- package.json (3 hunks)
- prisma/schema.prisma (1 hunks)
- src/components/form/field-text/index.tsx (1 hunks)
- src/components/form/form-field-controller.tsx (1 hunks)
- src/env/client.ts (1 hunks)
- src/env/server.ts (1 hunks)
- src/features/account/change-profile-picture-drawer.tsx (1 hunks)
- src/features/account/schema.ts (1 hunks)
- src/features/account/user-card.tsx (1 hunks)
- src/features/user/schema.ts (1 hunks)
- src/lib/object-storage/index.ts (1 hunks)
- src/lib/s3/index.ts (1 hunks)
- src/locales/ar/account.json (1 hunks)
- src/locales/en/account.json (1 hunks)
- src/locales/fr/account.json (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- package.json
- prisma/schema.prisma
🧰 Additional context used
🧬 Code graph analysis (6)
src/features/account/user-card.tsx (2)
src/env/client.ts (1)
envClient (24-54)
src/features/account/change-profile-picture-drawer.tsx (1)
ChangeProfilePictureDrawer(37-160)
src/features/account/schema.ts (1)
src/lib/zod/zod-utils.ts (1)
zu(4-92)
src/components/form/form-field-controller.tsx (1)
src/components/form/field-text/index.tsx (1)
FieldText(24-82)
src/lib/s3/index.ts (1)
src/env/server.ts (1)
envServer(9-44)
src/lib/object-storage/index.ts (1)
src/env/server.ts (1)
envServer(9-44)
src/features/account/change-profile-picture-drawer.tsx (2)
src/features/account/schema.ts (2)
FormFieldsAccountUpdateProfilePicture (13-15), zFormFieldsAccountUpdateProfilePicture (16-19)
src/components/form/form-field-controller.tsx (1)
FormFieldController(64-131)
🪛 dotenv-linter (3.3.0)
.env.example
[warning] 9-9: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 10-10: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 11-11: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 12-12: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 12-12: [UnorderedKey] The DOCKER_MINIO_PASSWORD key should go before the DOCKER_MINIO_UI_PORT key
(UnorderedKey)
[warning] 16-16: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 16-16: [UnorderedKey] The S3_BUCKET_NAME key should go before the S3_ENDPOINT key
(UnorderedKey)
[warning] 17-17: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 17-17: [UnorderedKey] The S3_ACCESS_KEY_ID key should go before the S3_BUCKET_NAME key
(UnorderedKey)
[warning] 18-18: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 19-19: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 19-19: [UnorderedKey] The S3_REGION key should go before the S3_SECRET_ACCESS_KEY key
(UnorderedKey)
[warning] 23-23: [UnorderedKey] The VITE_BASE_URL key should go before the VITE_S3_BUCKET_PUBLIC_URL key
(UnorderedKey)
[warning] 37-37: [ValueWithoutQuotes] This value needs to be surrounded in quotes
(ValueWithoutQuotes)
🪛 GitHub Actions: 🔎 Code Quality
src/env/client.ts
[error] 24-25: Invalid environment variables: VITE_S3_BUCKET_PUBLIC_URL is required (environment check failed during test setup).
🪛 GitHub Actions: 🧪 E2E Tests
src/env/server.ts
[error] 9-9: Invalid environment variables: DOCKER_MINIO_USERNAME, DOCKER_MINIO_PASSWORD, S3_ACCESS_KEY_ID, S3_SECRET_ACCESS_KEY, S3_ENDPOINT are missing or invalid. Command 'pnpm db:seed' failed with exit code 1.
🔇 Additional comments (10)
src/components/form/form-field-controller.tsx (1)
108-109: LGTM! The new file case correctly routes to FieldText, mirroring the existing pattern for text, tel types.
src/components/form/field-text/index.tsx (1)
19-19: Type extended to support file inputs. The union now includes 'file', enabling file input rendering through this component.
src/features/account/schema.ts (1)
13-19: LGTM! The new profile picture update schema follows the established pattern and correctly validates avatarFileId as a non-empty string.
src/locales/en/account.json (1)
24-30: Translations verified across locales. changeProfilePictureDrawer entries exist in en, fr, ar, and sw locale files with matching keys.
src/locales/fr/account.json (1)
24-31: LGTM! The French translations for the profile picture drawer are well-formed and consistent with the existing changeNameDrawer pattern.
src/locales/ar/account.json (1)
24-31: LGTM! The Arabic translations for the profile picture drawer are well-structured and follow the same pattern as the existing changeNameDrawer block.
src/features/account/user-card.tsx (1)
27-52: LGTM! The avatar display logic correctly checks for avatarFileId existence before rendering the image, and falls back to AvatarFallback when not present. The integration with ChangeProfilePictureDrawer provides good UX.
.env.example (1)
9-22: Pipeline failure: Add env vars to CI/CD configuration. The E2E test pipeline is failing because the new MinIO and S3 environment variables are missing from the CI/CD environment. You'll need to add these variables to your GitHub Actions secrets/environment configuration.
Required variables for CI/CD:
- DOCKER_MINIO_USERNAME
- DOCKER_MINIO_PASSWORD
- S3_ACCESS_KEY_ID
- S3_SECRET_ACCESS_KEY
- S3_ENDPOINT
src/features/account/change-profile-picture-drawer.tsx (1)
99-102: Verify mutation payload validation. The mutation receives avatarFileId: avatarUpload.uploadedFile?.objectKey, which could be undefined if the upload hasn't completed. While the disabled logic (lines 147-150) should prevent this, it would be safer to add a guard. Consider adding a check before mutation:
onSubmit={async () => {
  if (!avatarUpload.uploadedFile?.objectKey) {
    form.setError('avatarFileId', {
      message: t('account:changeProfilePictureDrawer.errorMessage'),
    });
    return;
  }
  updateUser.mutate({
    avatarFileId: avatarUpload.uploadedFile.objectKey,
  });
}}
src/env/server.ts (1)
31-39: LGTM! The environment variable schema definitions are correct and follow security best practices by requiring explicit values for credentials (no defaults). The default values for ports, bucket name, and region are sensible.
Note: The pipeline failure is expected since these variables need to be added to your CI/CD configuration before the E2E tests can pass.
const form = useForm<FormFieldsAccountUpdateProfilePicture>({
  resolver: zodResolver(zFormFieldsAccountUpdateProfilePicture()),
  values: {
    avatarFileId: '',
  },
});

const avatarUpload = useUploadFile({
  route: 'avatar',
  onError: (error) => {
    form.setError('avatarFileId', {
      message: error.message || 'An error occurred',
    });
  },
});
Form field never updated after upload.
The form field avatarFileId is initialized to an empty string (line 46) and validated as a non-empty string by the schema, but the file input's onChange handler (lines 124-132) never updates this field value after a successful upload. This means the form will always be invalid, preventing submission even after a successful upload.
Apply this diff to update the field after successful upload:
const avatarUpload = useUploadFile({
route: 'avatar',
+ onSuccess: (file) => {
+ form.setValue('avatarFileId', file.objectKey, {
+ shouldValidate: true,
+ });
+ },
onError: (error) => {
form.setError('avatarFileId', {
message: error.message || 'An error occurred',
});
},
});📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
const form = useForm<FormFieldsAccountUpdateProfilePicture>({
  resolver: zodResolver(zFormFieldsAccountUpdateProfilePicture()),
  values: {
    avatarFileId: '',
  },
});
const avatarUpload = useUploadFile({
  route: 'avatar',
  onError: (error) => {
    form.setError('avatarFileId', {
      message: error.message || 'An error occurred',
    });
  },
});

const avatarUpload = useUploadFile({
  route: 'avatar',
  onSuccess: (file) => {
    form.setValue('avatarFileId', file.objectKey, {
      shouldValidate: true,
    });
  },
  onError: (error) => {
    form.setError('avatarFileId', {
      message: error.message || 'An error occurred',
    });
  },
});
🤖 Prompt for AI Agents
In src/features/account/change-profile-picture-drawer.tsx around lines 43 to 57,
the form field avatarFileId is initialized but never updated after a successful
file upload; modify the upload success handler to set the form field value
(e.g., call form.setValue('avatarFileId', <uploadedFileId>) with the ID returned
by the upload) and clear any previous form errors for avatarFileId so the form
becomes valid and submittable after upload.
src/features/user/schema.ts
Outdated
createdAt: z.date(),
updatedAt: z.date(),
onboardedAt: z.date().nullish(),
avatarFileId: z.string().nullish(),
🧩 Analysis chain
Verify migration and breaking change impact.
The addition of avatarFileId (and removal of image per the summary) is a breaking change to the public User type. Ensure:
- A database migration exists to handle the schema change
- All references to the old image field have been updated
Run the following script to verify the migration and check for any remaining image field references:
🏁 Script executed:
#!/bin/bash
# Description: Verify Prisma migration exists and check for stale image field usage
echo "==> Check for Prisma migration adding avatarFileId:"
fd -e sql . prisma/migrations --exec grep -l "avatarFileId" {} \;
echo -e "\n==> Check for any remaining references to user.image or user?.image:"
rg -n --type ts '\buser\.image\b|\buser\?\.\bimage\b' -g '!*.test.ts' -g '!*.spec.ts'
echo -e "\n==> Check User type usage in select/include statements:"
rg -n --type ts "image:" -C 2 | grep -A2 -B2 "select\|include"
Length of output: 432
🏁 Script executed:
#!/bin/bash
# Locate Prisma schema file
echo "==> Locate schema.prisma:"
fd schema.prisma
# Search for avatarFileId in Prisma schema
echo -e "\n==> Search for avatarFileId in schema.prisma:"
rg -n "avatarFileId" $(fd schema.prisma)
# Locate any migrations directories
echo -e "\n==> Locate 'migrations' directories:"
fd -t d migrations
# Search for avatarFileId in migration files
echo -e "\n==> Search for avatarFileId in migration files:"
for dir in $(fd -t d migrations); do
rg -n "avatarFileId" "$dir" || true
done
# Search for avatarFileId across entire repo
echo -e "\n==> Search for avatarFileId across repo:"
rg -n "avatarFileId" || true
# Search for any remaining 'image' references (excluding tests)
echo -e "\n==> Search for 'user.image' or 'image:' references (excluding tests/specs):"
rg -n "(\.image\b|\?\.\image\b|image\s*:)" -g '!*.test.*' -g '!*.spec.*' || trueLength of output: 1773
🏁 Script executed:
#!/bin/bash
echo "==> Search for '.image' or '?.image' usages (excluding tests):"
rg -n "\.image\b|\?\.\image\b" -g '!*.test.*' -g '!*.spec.*' || echo "No .image patterns found"
echo -e "\n==> Search for keys named 'image:' (excluding tests):"
rg -n "image\s*:" -g '!*.test.*' -g '!*.spec.*' || echo "No 'image:' keys found"Length of output: 417
🏁 Script executed:
#!/bin/bash
echo "==> Searching for '.image' or '?.image' in TS/TSX (excluding tests/specs):"
rg -nP '(\.|\?\.)image\b' --glob '*.ts' --glob '*.tsx' --glob '!*.test.ts' --glob '!*.test.tsx' --glob '!*.spec.ts' --glob '!*.spec.tsx' || echo "No .image usages found"
echo -e "\n==> Searching for object keys 'image:' in TS/TSX (excluding tests/specs):"
rg -nP 'image\s*:' --glob '*.ts' --glob '*.tsx' --glob '!*.test.ts' --glob '!*.test.tsx' --glob '!*.spec.ts' --glob '!*.spec.tsx' || echo "No object key 'image:' found"
Length of output: 264
Add migration for avatarFileId and update lingering image reference.
- No Prisma migration detected for the new avatarFileId field; create and commit one.
- Replace session.data?.user.image with session.data?.user.avatarFileId in src/layout/manager/nav-user.tsx (line 52).
🤖 Prompt for AI Agents
In src/features/user/schema.ts around line 27 and across the schema/migrations,
the new avatarFileId field (avatarFileId: z.string().nullish()) was added but no
Prisma migration was created; generate a Prisma migration that adds the
avatarFileId column to the corresponding User model in schema.prisma and
run/commit the migration files. Also update the lingering reference in
src/layout/manager/nav-user.tsx at line 52 to use
session.data?.user.avatarFileId instead of session.data?.user.image so the UI
reads the new field. Ensure tests/build succeed and commit both the migration
and the nav-user.tsx change together.
Actionable comments posted: 2
🧹 Nitpick comments (1)
src/routes/api/upload.ts (1)
18-20: Add common image formats and reduce max file size. The current configuration is missing common formats and has an excessive size limit:
- Missing formats: image/gif and image/avif are common avatar formats. AVIF in particular offers superior compression.
- Size limit: 100MB is excessive for avatars. Typical avatar uploads are 5-10MB maximum, and reasonable limits improve UX and reduce resource usage.
Apply this diff to add missing formats and reduce the limit:
  avatar: route({
-   fileTypes: ['image/png', 'image/jpeg', 'image/webp'],
-   maxFileSize: 1024 * 1024 * 100, // 100Mb
+   fileTypes: ['image/png', 'image/jpeg', 'image/webp', 'image/gif', 'image/avif'],
+   maxFileSize: 1024 * 1024 * 10, // 10MB
    onBeforeUpload: async ({ req, file }) => {
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/routes/api/upload.ts(1 hunks)
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-02-21T08:38:40.338Z
Learnt from: yoannfleurydev
PR: BearStudio/start-ui-web#443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
Applied to files:
src/routes/api/upload.ts
🧬 Code graph analysis (1)
src/routes/api/upload.ts (4)
src/server/router.ts (2)
router(11-16)Router(8-8)src/lib/s3/index.ts (1)
s3client(5-12)src/env/server.ts (1)
envServer(9-44)src/server/auth.tsx (1)
auth(23-100)
🪛 GitHub Check: 🧹 Linter
src/routes/api/upload.ts
[warning] 26-26:
Complete the task associated to this "TODO" comment
🔇 Additional comments (2)
src/routes/api/upload.ts (2)
1-12: LGTM! The imports are appropriate and align with the better-upload server integration and TanStack Router pattern.
44-50: LGTM! The route definition follows TanStack Router patterns correctly, with the server-side POST handler properly configured to delegate to better-upload's request handler.
src/routes/api/upload.ts
Outdated
// The message error thrown here is visible to the user
// as a field error message.
// [TODO] Check to send i18n key so the message can be translated
throw new RejectUpload('Not logged in!');
Complete the TODO: Implement i18n for error messages.
The error message 'Not logged in!' is hardcoded and visible to users as a field error. This prevents localization and is flagged by the TODO comment. Better-upload surfaces these messages in the UI, so they should be translatable.
Would you like me to help implement an i18n solution for error messages in upload validation? I can suggest a pattern that integrates with your existing translation system.
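One possible shape, offered only as a sketch (the translation key names below are illustrative and do not exist in the locale files yet): keep throwing RejectUpload with a string, but make that string an i18n key the client resolves, falling back to the raw message when no translation exists.
// server: throw a translatable key instead of a hardcoded English sentence
throw new RejectUpload('account:uploadErrors.notLoggedIn');

// client (e.g. in the upload onError handler): resolve the key with i18next
const message = t(error.message, { defaultValue: error.message });
form.setError('avatarFileId', { message });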
🧰 Tools
🪛 GitHub Check: 🧹 Linter
[warning] 26-26:
Complete the task associated to this "TODO" comment
🤖 Prompt for AI Agents
In src/routes/api/upload.ts around lines 24-27, replace the hardcoded error
message thrown to users with a translatable identifier: instead of throw new
RejectUpload('Not logged in!') emit a structured error that carries an i18n key
(e.g. 'upload.not_logged_in') and optional defaultMessage; update RejectUpload
to accept { i18nKey: string, defaultMessage?: string } (or add an i18nKey
property) so the frontend can resolve translations, and keep the defaultMessage
as a fallback for untranslated locales.
src/routes/api/upload.ts
Outdated
const fileExtension = file.type.split('/').at(-1) as string;

return {
  // Here we want to keep the filename always the same
  // Each avatar upload will erase the previous one.
  objectInfo: {
    key: `${session.user.id}/avatar.${fileExtension}`,
Critical: Object key includes extension, preventing overwrites.
The current key pattern ${session.user.id}/avatar.${fileExtension} includes the file extension, which means uploading a different format (e.g., PNG then WebP) creates separate objects (userId/avatar.png, userId/avatar.webp) instead of overwriting. This defeats the stated intent in lines 33-34 and contradicts the learning that keys should be deterministic to prevent storage bloat.
Additionally, line 30 extracts the extension without validating that file.type is well-formed, risking undefined behavior if the type is malformed.
Apply this diff to use a format-agnostic key that ensures overwrites:
- const fileExtension = file.type.split('/').at(-1) as string;
-
return {
// Here we want to keep the filename always the same
// Each avatar upload will erase the previous one.
objectInfo: {
- key: `${session.user.id}/avatar.${fileExtension}`,
+ key: `avatars/${session.user.id}`,
},
+ metadata: {
+ userId: session.user.id,
+ contentType: file.type,
+ },
};
Based on learnings
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
const fileExtension = file.type.split('/').at(-1) as string;
return {
  // Here we want to keep the filename always the same
  // Each avatar upload will erase the previous one.
  objectInfo: {
    key: `${session.user.id}/avatar.${fileExtension}`,

// (Remove the now-unused fileExtension extraction)
return {
  // Here we want to keep the filename always the same
  // Each avatar upload will erase the previous one.
  objectInfo: {
    key: `avatars/${session.user.id}`,
  },
  metadata: {
    userId: session.user.id,
    contentType: file.type,
  },
};
🤖 Prompt for AI Agents
In src/routes/api/upload.ts around lines 30 to 36, the object key currently
includes the file extension which prevents overwrites across different formats
and the file.type split isn't validated; change the key to a format-agnostic
deterministic value (e.g., `${session.user.id}/avatar`) so every upload replaces
the previous object, and stop deriving extension for the storage key;
additionally, validate file.type before using it (guard against undefined or
malformed values) and, if needed, extract and use the content type only for the
upload metadata (not the key), falling back to a safe default or returning a 400
error when the MIME type is invalid.
494eeda to
6c8b377
Compare
Actionable comments posted: 6
♻️ Duplicate comments (1)
src/components/form/field-text/index.tsx (1)
19-21: File input handling issue remains unresolved. Adding 'file' to the type union is necessary, but the underlying implementation issue flagged in the previous review has not been addressed. Spreading {...field} for type="file" will pass a value prop and forward the raw event, causing React warnings and preventing React Hook Form from receiving the File/File[] correctly. Please refer to the previous review comment for the complete fix that handles file inputs specially by extracting files from e.target.files and passing them to field.onChange.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (27)
- .env.example (1 hunks)
- docker-compose.yml (1 hunks)
- package.json (3 hunks)
- prisma/schema.prisma (1 hunks)
- src/components/form/field-text/index.tsx (1 hunks)
- src/components/form/form-field-controller.tsx (1 hunks)
- src/env/client.ts (1 hunks)
- src/env/server.ts (1 hunks)
- src/features/account/schema.ts (1 hunks)
- src/features/account/user-card.tsx (2 hunks)
- src/features/book/book-cover.tsx (1 hunks)
- src/features/book/manager/form-book-cover.tsx (2 hunks)
- src/features/book/manager/page-book-update.tsx (2 hunks)
- src/features/book/schema.ts (1 hunks)
- src/features/user/schema.ts (1 hunks)
- src/lib/object-storage/index.ts (1 hunks)
- src/lib/s3/index.ts (1 hunks)
- src/locales/ar/account.json (1 hunks)
- src/locales/en/account.json (1 hunks)
- src/locales/fr/account.json (1 hunks)
- src/locales/sw/account.json (1 hunks)
- src/routeTree.gen.ts (11 hunks)
- src/routes/api/upload.ts (1 hunks)
- src/routes/app/account.index.tsx (1 hunks)
- src/routes/manager/account.index.tsx (1 hunks)
- src/server/routers/account.ts (1 hunks)
- src/server/routers/book.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (10)
- src/env/client.ts
- prisma/schema.prisma
- src/features/account/schema.ts
- src/locales/en/account.json
- docker-compose.yml
- src/features/account/user-card.tsx
- src/features/user/schema.ts
- src/lib/object-storage/index.ts
- src/locales/ar/account.json
- package.json
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2024-10-14T15:29:53.279Z
Learnt from: ivan-dalmet
Repo: BearStudio/start-ui-web PR: 532
File: src/server/config/oauth/providers/discord.ts:11-11
Timestamp: 2024-10-14T15:29:53.279Z
Learning: In `src/server/config/oauth/providers/discord.ts`, when defining the `zDiscordUser` schema, keep fields like `username` as nullable (e.g., `.nullish()`), as the Discord API might return null values despite the documentation stating otherwise.
Applied to files:
src/server/routers/account.ts
src/features/book/schema.ts
📚 Learning: 2025-02-21T08:38:40.338Z
Learnt from: yoannfleurydev
Repo: BearStudio/start-ui-web PR: 443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
Applied to files:
src/routes/api/upload.ts
🧬 Code graph analysis (10)
src/components/form/form-field-controller.tsx (1)
src/components/form/field-text/index.tsx (1)
FieldText(24-82)
src/features/book/manager/page-book-update.tsx (2)
src/components/form/field-number/docs.stories.tsx (1)
form(138-161)src/components/form/field-text/docs.stories.tsx (1)
form(84-112)
src/server/routers/book.ts (1)
src/server/routers/user.ts (1)
context(135-173)
src/lib/s3/index.ts (1)
src/env/server.ts (1)
envServer(9-50)
src/server/routers/account.ts (1)
src/server/routers/user.ts (1)
context(135-173)
src/features/book/book-cover.tsx (2)
src/features/book/schema.ts (1)
Book(8-8)src/env/client.ts (1)
envClient(24-53)
src/routes/api/upload.ts (4)
src/server/router.ts (2)
router(12-18)Router(9-9)src/lib/s3/index.ts (1)
s3client(5-12)src/env/server.ts (1)
envServer(9-50)src/server/auth.tsx (1)
auth(23-110)
src/features/book/manager/form-book-cover.tsx (1)
src/features/book/book-cover.tsx (1)
BookCover(8-56)
src/routes/app/account.index.tsx (3)
src/routes/manager/users/index.tsx (1)
RouteComponent(19-22)src/routes/login/verify.index.tsx (1)
RouteComponent(17-20)src/routes/manager/books/index.tsx (1)
RouteComponent(19-22)
.env.example (1)
vite.config.ts (1)
env(15-48)
🪛 dotenv-linter (4.0.0)
.env.example
[warning] 9-9: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 10-10: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 11-11: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 12-12: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 12-12: [UnorderedKey] The DOCKER_MINIO_PASSWORD key should go before the DOCKER_MINIO_UI_PORT key
(UnorderedKey)
[warning] 16-16: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 16-16: [UnorderedKey] The S3_BUCKET_NAME key should go before the S3_ENDPOINT key
(UnorderedKey)
[warning] 17-17: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 17-17: [UnorderedKey] The S3_ACCESS_KEY_ID key should go before the S3_BUCKET_NAME key
(UnorderedKey)
[warning] 18-18: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 19-19: [QuoteCharacter] The value has quote characters (', ")
(QuoteCharacter)
[warning] 19-19: [UnorderedKey] The S3_REGION key should go before the S3_SECRET_ACCESS_KEY key
(UnorderedKey)
[warning] 23-23: [UnorderedKey] The VITE_BASE_URL key should go before the VITE_S3_BUCKET_PUBLIC_URL key
(UnorderedKey)
🪛 GitHub Actions: 🔎 Code Quality
src/routes/api/upload.ts
[error] 43-43: ESLint: Complete the task associated to this "TODO" comment. (sonarjs/todo-tag)
src/features/book/manager/form-book-cover.tsx
[error] 37-37: ESLint: Complete the task associated to this "TODO" comment. (sonarjs/todo-tag)
🪛 GitHub Check: 🧹 Linter
src/routes/api/upload.ts
[warning] 43-43:
Complete the task associated to this "TODO" comment
src/features/book/manager/form-book-cover.tsx
[warning] 37-37:
Complete the task associated to this "TODO" comment
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Playwright E2E Tests
🔇 Additional comments (12)
src/server/routers/book.ts (1)
177-177: LGTM!The coverId field is properly added to the update payload with appropriate null handling, consistent with other optional fields like
publisher.src/locales/sw/account.json (1)
24-30: LGTM!The Swahili translations for the profile picture change drawer are properly structured and consistent with the pattern established in other locale files.
src/components/form/form-field-controller.tsx (1)
108-109: File type routing is correct but depends on unresolved FieldText issue.The new
'file'case correctly routes toFieldText, but note thatFieldTexthas an unresolved critical issue with file input handling (see review comment onsrc/components/form/field-text/index.tsx). Once that issue is resolved, this routing will function properly.src/features/book/schema.ts (2)
19-19: LGTM!The
coverIdfield is properly defined as a nullable string using.nullish(), consistent with the existing pattern for optional fields likepublisher.Based on learnings
25-25: LGTM!The
coverIdfield is correctly included in the form fields schema, enabling it to be used in book creation and update forms.src/locales/fr/account.json (1)
24-30: LGTM!The French translations for the profile picture change drawer are properly structured and consistent with other locale files.
.env.example (3)
9-19: LGTM!The MinIO and S3 configuration variables are properly defined with correct variable references. The previous critical issues have been addressed.
Note: Static analysis warnings about quote characters and key ordering are stylistic and can be safely ignored for an example file.
22-22: LGTM!The public S3 bucket URL is correctly configured using the MinIO API port and bucket name, making uploaded assets accessible to the client.
28-28: LGTM!The
VITE_ENV_COLORvariable is properly added for environment theming.src/features/book/manager/page-book-update.tsx (1)
42-42: LGTM!The
coverIdfield is correctly initialized in the form values with an appropriate fallback to an empty string.src/routeTree.gen.ts (1)
1-740: LGTM — Auto-generated route registration.This file is auto-generated by TanStack Router and correctly registers the new
/api/uploadroute across all route maps, type definitions, and the root route tree. The integration appears complete and follows the expected pattern for route registration.Note: Any issues with the route implementation should be addressed in the source file
src/routes/api/upload.ts, not in this generated file.src/env/server.ts (1)
37-45: S3 configuration is currently required and will cause startup failure if missing.The verification shows that:
S3 is unconditionally initialized —
src/lib/s3/index.tsinstantiatesS3Clientat module load time without conditional checks or error handling. The application cannot start without validS3_ACCESS_KEY_ID,S3_SECRET_ACCESS_KEY, andS3_ENDPOINT.Upload feature is tightly coupled —
src/routes/api/upload.tsdirectly imports and usess3clientwithout fallback logic.Inconsistent with GITHUB pattern — Unlike
GITHUB_CLIENT_IDandGITHUB_CLIENT_SECRET(which usezOptionalWithReplaceMe()to allow optional authentication), S3 variables lack this flexibility.DOCKER_MINIO variables appear unused —
DOCKER_MINIO_USERNAMEandDOCKER_MINIO_PASSWORDare validated insrc/env/server.tsbut do not appear to be referenced in the codebase. Verify if these should be removed or if integration is incomplete.If file upload is a core requirement, the current implementation is acceptable. If file upload should be optional, refactor S3 initialization to follow the GITHUB pattern with conditional imports and graceful degradation.
const { upload, uploadedFile } = useUploadFile({
  route: 'bookCover',
  onUploadComplete: ({ file }) => {
    form.setValue('coverId', file.objectKey);
  },
});

// [TODO] Handle upload errors

const genre = genresQuery.data?.items.find((item) => item.id === genreId);
return (
  <BookCover
    book={{
      title,
      author,
      genre,
    }}
  />
  <div className="relative">
    <label htmlFor="coverId">
      <input
        className="hidden"
        id="coverId"
        type="file"
        name="coverId"
        onChange={(e) => {
          if (e.target.files?.[0]) {
            upload(e.target.files[0]);
          }
        }}
      />
      <input type="hidden" {...form.register('coverId')} />
      <BookCover
Handle upload failures instead of leaving a TODO
The TODO at Line 37 is tripping the linter and, more importantly, we still surface no feedback if the upload promise rejects. Please remove the TODO by actually handling errors—catch the rejection and surface a validation message (ideally localized) so the form can react.
- onUploadComplete: ({ file }) => {
- form.setValue('coverId', file.objectKey);
- },
+ onUploadComplete: ({ file }) => {
+ form.clearErrors('coverId');
+ form.setValue('coverId', file.objectKey, { shouldDirty: true });
+ },
});
- // [TODO] Handle upload errors
+ const handleUpload = (file: File) => {
+ void upload(file).catch(() => {
+ form.setError('coverId', {
+ type: 'manual',
+ message: 'cover_upload_failed',
+ });
+ });
+ };
...
- upload(e.target.files[0]);
+ handleUpload(e.target.files[0]);Replace 'cover_upload_failed' with your localized copy or translation lookup.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
const { upload, uploadedFile } = useUploadFile({
  route: 'bookCover',
  onUploadComplete: ({ file }) => {
    form.setValue('coverId', file.objectKey);
  },
});
// [TODO] Handle upload errors
const genre = genresQuery.data?.items.find((item) => item.id === genreId);
return (
  <BookCover
    book={{
      title,
      author,
      genre,
    }}
  />
  <div className="relative">
    <label htmlFor="coverId">
      <input
        className="hidden"
        id="coverId"
        type="file"
        name="coverId"
        onChange={(e) => {
          if (e.target.files?.[0]) {
            upload(e.target.files[0]);
          }
        }}
      />
      <input type="hidden" {...form.register('coverId')} />
      <BookCover

const { upload, uploadedFile } = useUploadFile({
  route: 'bookCover',
  onUploadComplete: ({ file }) => {
    form.clearErrors('coverId');
    form.setValue('coverId', file.objectKey, { shouldDirty: true });
  },
});
const handleUpload = (file: File) => {
  void upload(file).catch(() => {
    form.setError('coverId', {
      type: 'manual',
      message: 'cover_upload_failed',
    });
  });
};
const genre = genresQuery.data?.items.find((item) => item.id === genreId);
return (
  <div className="relative">
    <label htmlFor="coverId">
      <input
        className="hidden"
        id="coverId"
        type="file"
        name="coverId"
        onChange={(e) => {
          if (e.target.files?.[0]) {
            handleUpload(e.target.files[0]);
          }
        }}
      />
      <input type="hidden" {...form.register('coverId')} />
      <BookCover
🧰 Tools
🪛 GitHub Actions: 🔎 Code Quality
[error] 37-37: ESLint: Complete the task associated to this "TODO" comment. (sonarjs/todo-tag)
🪛 GitHub Check: 🧹 Linter
[warning] 37-37:
Complete the task associated to this "TODO" comment
🤖 Prompt for AI Agents
In src/features/book/manager/form-book-cover.tsx around lines 30 to 55, replace
the TODO by catching upload failures and surfacing a form validation error: wrap
the upload call in a try/catch (or handle the returned promise) and on failure
call form.setError('coverId', { type: 'manual', message:
t('cover_upload_failed') }) (or your i18n lookup) so the UI shows a localized
error; also ensure any previous error is cleared on successful upload with
form.clearErrors('coverId') and remove the TODO comment.
<Form
  {...form}
  onSubmit={async (values) => {
    console.log({ values });
Remove debug console.log.
This console.log statement should be removed before merging to production.
Apply this diff:
- console.log({ values });
bookUpdate.mutate({ id: props.params.id, ...values });📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
console.log({ values });
bookUpdate.mutate({ id: props.params.id, ...values });
🤖 Prompt for AI Agents
In src/features/book/manager/page-book-update.tsx around line 85, remove the
debug console.log({ values }); statement: delete that line entirely to avoid
leaking debug output in production, and if needed replace with proper logging
via the app's logger or remove without substitution.
export const s3client = new S3Client({
  endpoint: envServer.S3_ENDPOINT,
  credentials: {
    accessKeyId: envServer.S3_ACCESS_KEY_ID,
    secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
  },
  region: envServer.S3_REGION,
});
Enable path-style requests for MinIO
When targeting MinIO (or any custom endpoint with a non-S3 domain/port), the SDK must use path-style URLs; otherwise it will try bucket.endpoint hostnames like default.localhost:9000, which fail to resolve. Add forcePathStyle: true to keep uploads working.
export const s3client = new S3Client({
endpoint: envServer.S3_ENDPOINT,
+ forcePathStyle: true,
credentials: {
accessKeyId: envServer.S3_ACCESS_KEY_ID,
secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
},
region: envServer.S3_REGION,
});📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
export const s3client = new S3Client({
  endpoint: envServer.S3_ENDPOINT,
  credentials: {
    accessKeyId: envServer.S3_ACCESS_KEY_ID,
    secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
  },
  region: envServer.S3_REGION,
});

export const s3client = new S3Client({
  endpoint: envServer.S3_ENDPOINT,
  forcePathStyle: true,
  credentials: {
    accessKeyId: envServer.S3_ACCESS_KEY_ID,
    secretAccessKey: envServer.S3_SECRET_ACCESS_KEY,
  },
  region: envServer.S3_REGION,
});
🤖 Prompt for AI Agents
In src/lib/s3/index.ts around lines 5 to 12, the S3Client configuration lacks
path-style requests, which breaks usage against MinIO/custom endpoints; update
the S3Client constructor options to include forcePathStyle: true so the SDK uses
path-style URLs (bucket in path) instead of virtual-hosted style (bucket as
hostname), ensuring uploads work with MinIO and non-S3 endpoints.
// normalize file extension from detected mimetype
const fileExtension = file.type.split('/').at(-1) as string;
return {
  // I think it is a good idea to create a random file id as it impersonate the file name (which can contains sensitive data :/ ?)
  // [TODO] Add a way to clean unused documents
  objectInfo: {
    key: `books/${crypto.randomUUID()}.${fileExtension}`,
  },
};
Replace the TODO with real cleanup handling
The [TODO] Add a way to clean unused documents note is breaking the SonarJS TODO check and blocks the pipeline. Please either implement the cleanup strategy now or link to a tracked issue and rephrase the comment to whatever tag your lint rules permit (e.g., FIXME(ISSUE-123) or documented backlog reference), so the linter passes.
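If the cleanup is implemented rather than deferred, one possible direction is a small scheduled job that removes cover objects no book references anymore. This is only a hedged sketch: the books/ prefix and S3_BUCKET_NAME come from this PR, but the job itself, the import paths, and the db.book Prisma accessor are assumptions.
import { ListObjectsV2Command, DeleteObjectCommand } from '@aws-sdk/client-s3';
import { s3client } from '@/lib/s3'; // import paths assumed for illustration
import { envServer } from '@/env/server';
import { db } from '@/server/db'; // hypothetical Prisma client export

// Deletes objects under books/ that no Book.coverId references anymore.
// Pagination is omitted for brevity.
export async function cleanUnusedBookCovers() {
  const listed = await s3client.send(
    new ListObjectsV2Command({
      Bucket: envServer.S3_BUCKET_NAME,
      Prefix: 'books/',
    })
  );

  for (const object of listed.Contents ?? []) {
    if (!object.Key) continue;
    const stillReferenced = await db.book.findFirst({
      where: { coverId: object.Key },
      select: { id: true },
    });
    if (!stillReferenced) {
      await s3client.send(
        new DeleteObjectCommand({ Bucket: envServer.S3_BUCKET_NAME, Key: object.Key })
      );
    }
  }
}
Run on a schedule (cron, a CI job, or a bucket lifecycle rule) so unused uploads do not accumulate.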
🧰 Tools
🪛 GitHub Actions: 🔎 Code Quality
[error] 43-43: ESLint: Complete the task associated to this "TODO" comment. (sonarjs/todo-tag)
🪛 GitHub Check: 🧹 Linter
[warning] 43-43:
Complete the task associated to this "TODO" comment
🤖 Prompt for AI Agents
In src/routes/api/upload.ts around lines 39-47, the inline TODO "Add a way to
clean unused documents" is tripping SonarJS and blocking the pipeline; either
implement cleanup now or convert the note to an allowed lint tag with a tracked
issue. To implement: persist upload metadata (key, upload timestamp, owner/ref)
and create a background cleanup task (or use an object-store lifecycle rule) to
delete unreferenced objects older than a configured TTL; ensure the upload path
records a reference flag that is cleared when a document is permanently unused
so the cleaner can remove it. Alternatively, if you cannot implement it now,
replace the TODO with an allowed tag and issue reference, e.g. comment
"FIXME(ISSUE-123): cleanup of unused documents tracked at
https://tracker/ISSUE-123", and add the issue to your backlog so the linter
passes.
- add checks on user permissions
- randomized file name (not sure it is the right thing to do)
6c8b377 to
821d64a
Compare
Actionable comments posted: 1
♻️ Duplicate comments (2)
src/features/book/manager/form-book-cover.tsx (1)
30-55: Handle upload failures and drop the lingering TODO
upload rejections currently vanish, leaving the form stuck with a hidden field error, and the TODO is failing the lint pipeline. Please wrap the upload call so you clear previous errors on success, mark the field as dirty, and surface a localized validation error on failure (then remove the TODO). A minimal fix looks like:
-import { useUploadFile } from 'better-upload/client';
+import { useUploadFile } from 'better-upload/client';
+import { useTranslation } from 'react-i18next';
…
 export const FormBookCover = () => {
   const form = useFormContext<FormFieldsBook>();
+  const { t } = useTranslation(['book']);
…
-  const { upload, uploadedFile } = useUploadFile({
+  const { upload, uploadedFile } = useUploadFile({
     route: 'bookCover',
     onUploadComplete: ({ file }) => {
-      form.setValue('coverId', file.objectKey);
+      form.clearErrors('coverId');
+      form.setValue('coverId', file.objectKey, { shouldDirty: true });
     },
   });
-  // [TODO] Handle upload errors
+  const handleUpload = (file: File) => {
+    void upload(file).catch(() => {
+      form.setError('coverId', {
+        type: 'manual',
+        message: t('book:form.coverUploadFailed'),
+      });
+    });
+  };
…
-  upload(e.target.files[0]);
+  handleUpload(e.target.files[0]);
Pick whatever translation key fits your namespace, but the important part is to stop swallowing errors and unblock lint.
src/routes/api/upload.ts (1)
39-46: Resolve the storage-cleanup TODO and random key leakEvery cover upload generates a brand-new
books/${uuid}object; without the promised cleanup this leaks unused blobs indefinitely, and the[TODO]keeps blocking the lint job. Either switch to a deterministic key that overwrites (e.g. tie it to the eventual book identifier) or implement the cleanup path now (scheduled job or lifecycle rule) and remove the TODO, referencing the tracked issue if you defer work. This was exactly the concern called out in our previous avatar upload work—let’s keep storage under control. Based on learnings
🧹 Nitpick comments (1)
src/server/routers/account.ts (1)
47-47: Cosmetic whitespace change. This added blank line has no functional impact.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (10)
- package.json (3 hunks)
- prisma/schema.prisma (1 hunks)
- src/features/book/book-cover.tsx (1 hunks)
- src/features/book/manager/form-book-cover.tsx (2 hunks)
- src/features/book/manager/page-book-update.tsx (1 hunks)
- src/features/book/schema.ts (1 hunks)
- src/features/user/schema.ts (0 hunks)
- src/routes/api/upload.ts (1 hunks)
- src/server/routers/account.ts (1 hunks)
- src/server/routers/book.ts (1 hunks)
💤 Files with no reviewable changes (1)
- src/features/user/schema.ts
🚧 Files skipped from review as they are similar to previous changes (5)
- src/features/book/manager/page-book-update.tsx
- src/features/book/schema.ts
- package.json
- prisma/schema.prisma
- src/server/routers/book.ts
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-02-21T08:38:40.338Z
Learnt from: yoannfleurydev
Repo: BearStudio/start-ui-web PR: 443
File: src/lib/s3/config.ts:13-13
Timestamp: 2025-02-21T08:38:40.338Z
Learning: In the avatar upload feature, the S3 key is intentionally set to `avatars/${user.id}` to ensure that new avatar uploads overwrite the previous one, preventing storage bloat and simplifying avatar management.
Applied to files:
src/server/routers/account.ts
src/routes/api/upload.ts
🧬 Code graph analysis (4)
src/features/book/book-cover.tsx (2)
src/features/book/schema.ts (1)
Book(8-8)src/env/client.ts (1)
envClient(24-53)
src/features/book/manager/form-book-cover.tsx (1)
src/features/book/book-cover.tsx (1)
BookCover(9-62)
src/server/routers/account.ts (1)
src/server/routers/user.ts (1)
context(135-173)
src/routes/api/upload.ts (4)
src/server/router.ts (2)
router(12-18)Router(9-9)src/lib/s3/index.ts (1)
s3client(5-12)src/env/server.ts (1)
envServer(9-50)src/server/auth.tsx (1)
auth(23-110)
🪛 GitHub Actions: 🔎 Code Quality
src/features/book/manager/form-book-cover.tsx
[warning] 37-37: ESLint: sonarjs/todo-tag - Complete the task associated to this "TODO" comment.
src/routes/api/upload.ts
[warning] 43-43: ESLint: sonarjs/todo-tag - Complete the task associated to this "TODO" comment.
🪛 GitHub Check: 🧹 Linter
src/features/book/manager/form-book-cover.tsx
[warning] 37-37:
Complete the task associated to this "TODO" comment
src/routes/api/upload.ts
[warning] 43-43:
Complete the task associated to this "TODO" comment
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Playwright E2E Tests
onBeforeUpload: async ({ req, file }) => {
  const session = await auth.api.getSession({ headers: req.headers });

  // Only admins should be able to update book covers
  const canUpdateBookCover = await auth.api.userHasPermission({
    body: {
      userId: session?.user.id,
      permissions: {
        book: ['create', 'update'],
      },
      role: 'admin',
    },
  });

  if (!canUpdateBookCover.success) {
    throw new RejectUpload('Not authorized');
  }
Check the session before the permission call
When the request is unauthenticated, session is null and we still call auth.api.userHasPermission with an undefined userId. That call will fail its schema validation and you end up replying 500 instead of a clean 401. Guard the session first and reuse the validated session.user.id:
onBeforeUpload: async ({ req, file }) => {
const session = await auth.api.getSession({ headers: req.headers });
+ if (!session?.user) {
+ throw new RejectUpload('upload.not_authenticated');
+ }
// Only admins should be able to update book covers
const canUpdateBookCover = await auth.api.userHasPermission({
body: {
- userId: session?.user.id,
+ userId: session.user.id,Return an i18n-friendly key in the RejectUpload while you’re there so the client can translate it.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
onBeforeUpload: async ({ req, file }) => {
  const session = await auth.api.getSession({ headers: req.headers });
  // Only admins should be able to update book covers
  const canUpdateBookCover = await auth.api.userHasPermission({
    body: {
      userId: session?.user.id,
      permissions: {
        book: ['create', 'update'],
      },
      role: 'admin',
    },
  });
  if (!canUpdateBookCover.success) {
    throw new RejectUpload('Not authorized');
  }

onBeforeUpload: async ({ req, file }) => {
  const session = await auth.api.getSession({ headers: req.headers });
  if (!session?.user) {
    throw new RejectUpload('upload.not_authenticated');
  }
  // Only admins should be able to update book covers
  const canUpdateBookCover = await auth.api.userHasPermission({
    body: {
      userId: session.user.id,
      permissions: {
        book: ['create', 'update'],
      },
      role: 'admin',
    },
  });
  if (!canUpdateBookCover.success) {
    throw new RejectUpload('Not authorized');
  }
🤖 Prompt for AI Agents
In src/routes/api/upload.ts around lines 21 to 37, the code calls
auth.api.userHasPermission with session?.user.id even when session may be null,
which causes schema validation errors and a 500; first check that session exists
and if not immediately throw a RejectUpload with an i18n-friendly key (e.g.
'error.unauthorized' or 'upload.not_authenticated'), then use the validated
session.user.id for the userHasPermission call; also return an i18n-friendly key
in the RejectUpload for the authorization failure (e.g. 'upload.not_authorized')
so the client can translate it.
See #634. I am closing this PR because I have TS errors in the eslint.config.ts file and I'm unable to fix them. The linked PR contains all the code needed for the file upload to work.



Describe your changes
Set up basic file upload on startui
Screenshots
CleanShot.2025-09-12.at.15.30.32.mp4
In the db, we store the path to the uploaded file (here an avatar)

Checklist
pnpm storybook command and everything is working (If you cannot update the French language, just let us know in the PR description)
Summary by CodeRabbit
Release Notes
New Features
Chores